# Chinese Pretraining

MiniRBT-h288
Apache-2.0
MiniRBT is a small Chinese pretrained model built with knowledge distillation and optimized for training efficiency with Whole Word Masking.
Large Language Model Transformers Chinese
hfl
405
8
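The Whole Word Masking mentioned above means that when any piece of a segmented word is chosen for masking, every piece of that word is masked together. A minimal sketch of the idea in plain Python (the function name and `word_ids` representation are illustrative assumptions, not the actual MiniRBT training code, which works on tokenizer output plus a Chinese word segmenter):

```python
import random

def whole_word_mask(tokens, word_ids, mask_rate=0.15, seed=0):
    """Whole Word Masking: when a word is selected, every token
    piece belonging to that word is replaced by [MASK] together."""
    rng = random.Random(seed)
    masked_words = {w for w in sorted(set(word_ids))
                    if rng.random() < mask_rate}
    return ["[MASK]" if w in masked_words else tok
            for tok, w in zip(tokens, word_ids)]

# "哈尔滨" is one segmented word spanning three characters, so its
# pieces share word id 0 and are masked (or kept) as a unit.
tokens   = ["哈", "尔", "滨", "很", "冷"]
word_ids = [0, 0, 0, 1, 2]
print(whole_word_mask(tokens, word_ids, mask_rate=1.0))
```

With character-level masking, "哈" could be masked while "尔滨" stays visible, making the prediction too easy; masking the whole word forces the model to use surrounding context.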
Chinese RoBERTa L-8 H-256
A Chinese RoBERTa model pretrained on CLUECorpusSmall, with 8 layers and 256 hidden units, suitable for various Chinese NLP tasks.
Large Language Model Chinese
uer
15
1
T5-Small Chinese CLUECorpusSmall
A small Chinese T5 model pretrained with the UER-py framework, adopting a unified text-to-text format to handle various Chinese NLP tasks.
Large Language Model Chinese
uer
1,336
19
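The "unified text-to-text format" above means every task, whether classification, summarization, or translation, is cast as mapping one string to another. A minimal sketch, assuming a hypothetical `task_prefix` convention (the actual prompt format used by the uer T5 checkpoints may differ):

```python
def to_text2text(task_prefix, source_text, target_text):
    """Unified text-to-text format: every task becomes a pair of
    strings, with a task prefix prepended to the model input."""
    return f"{task_prefix}: {source_text}", target_text

# A classification example and a summarization example both reduce
# to the same (input string, output string) shape, so one
# sequence-to-sequence model can serve both tasks.
cls = to_text2text("sentiment", "这部电影很好看", "positive")
print(cls)
```

Because inputs and outputs are always plain text, new tasks need only new example pairs, not new model heads.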
© 2025 AIbase